# 19 Linear independence, a first look

We have been studying redundancies in a list of vectors used as a spanning set. We will now introduce the concept of **linear independence**, which captures this idea of "no redundancies" among a list of vectors. Let us define it first.

> **Definition of linear independence.**
> Given a list (or set) of vectors $v_{1},v_{2},\ldots,v_{k}$, we say this list (or set) is **linearly independent** if the equation $$ c_{1}v_{1}+ c_{2}v_{2}+\cdots + c_{k}v_{k}=\mathbf{0} $$ has **only** the trivial solution; namely, the only way to make this equation true is when all the coefficients $c_{1}=c_{2}=\cdots = c_{k}=0$, and in no other way.

**Remark.** Here the **trivial solution** refers to the solution in which the coefficients $c_{1},c_{2},\ldots,c_{k}$ are all zero. If we have found some choice of the $c_{i}$, not all zero, that makes the above equation true, then we say it is a **nontrivial solution**. We also say the linear combination $0v_{1}+0v_{2}+\cdots + 0v_{k}$ is the **trivial linear combination**, and that if any one of the $c_{i}$ is not zero, the linear combination $c_{1}v_{1}+c_2v_{2}+\cdots + c_{k}v_{k}$ is a **nontrivial linear combination**. So, a set of vectors $v_{1},\ldots,v_{k}$ being linearly independent means the trivial linear combination is the only linear combination that makes the zero vector $\mathbf{0}$.

**Remark.** Morally speaking, a list (or set) of vectors $v_{1},v_{2},\ldots,v_{k}$ is linearly independent if there are no redundancies among them. That is, using $\{v_{1},v_{2},\ldots,v_{k}\}$ as a spanning set for $\operatorname{span}(v_{1},v_{2},\ldots,v_{k})$ is as economical as it gets -- removing any one of them would change the span. Geometrically, if a list of vectors $v_{1},v_{2},\ldots,v_{k}$ is linearly independent, then "they are all pointing in different directions, not reachable by combining the others."

Let us see some examples using the definition.
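As a quick computational aside before the worked examples: for the special case of two vectors in the plane, a nontrivial solution exists exactly when one vector is a scalar multiple of the other, and a $2\times 2$ determinant detects this. (This determinant criterion is not proved in these notes; it follows from the definition. The helper name below is ours, for illustration.) A minimal Python sketch:

```python
# For two vectors v, w in the plane, c1*v + c2*w = 0 has a nontrivial
# solution exactly when one vector is a scalar multiple of the other,
# i.e. exactly when the determinant v[0]*w[1] - v[1]*w[0] is zero.
# (The helper name `independent_pair` is ours, for illustration.)
def independent_pair(v, w):
    return v[0] * w[1] - v[1] * w[0] != 0

print(independent_pair((1, 0), (0, 1)))  # True: different directions
print(independent_pair((1, 2), (3, 6)))  # False: (3, 6) = 3 * (1, 2)
```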
**Example.** Is the list of vectors $\begin{bmatrix}1 \\2\end{bmatrix},\begin{bmatrix}2\\4\end{bmatrix}$ linearly independent? To determine this, we check what kind of solutions the equation $$ c_{1}\begin{bmatrix}1\\2\end{bmatrix} + c_{2}\begin{bmatrix}2\\4\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix} $$ admits. Notice that we have a nontrivial solution with $c_{1}=-2$ and $c_{2}=1$, since $$ -2\begin{bmatrix}1\\2\end{bmatrix} + 1\begin{bmatrix}2\\4\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix}, $$ so this shows $\begin{bmatrix}1 \\2\end{bmatrix},\begin{bmatrix}2\\4\end{bmatrix}$ are **not linearly independent**.

**Example.** Is the list of vectors $\begin{bmatrix}1 \\2\end{bmatrix},\begin{bmatrix}2\\2\end{bmatrix}$ linearly independent? To determine this, we check what kind of solutions the equation $$ c_{1}\begin{bmatrix}1\\2\end{bmatrix} + c_{2}\begin{bmatrix}2\\2\end{bmatrix} = \begin{bmatrix}0\\0\end{bmatrix} $$ admits. This is equivalent to checking whether the augmented matrix $$ \begin{bmatrix}1 & 2 & \vdots & 0 \\ 2 & 2 & \vdots & 0\end{bmatrix} $$ leads to a unique solution or not. Since this homogeneous equation always has the trivial solution as a solution, if it has a unique solution then that is the only solution, and there is no nontrivial solution. And indeed this augmented matrix leads to a unique solution, so we conclude we only have the trivial solution $c_{1}=c_{2}=0$. Hence $\begin{bmatrix}1 \\2\end{bmatrix},\begin{bmatrix}2\\2\end{bmatrix}$ are **linearly independent**.

This leads to a **computational approach** to determine whether a set of column vectors is linearly independent or not:

> **Computational note.**
> Given a list of column vectors $v_{1},v_{2},\ldots,v_{k}$, they are **linearly independent** if and only if the matrix $[\ v_{1}\ |\ v_{2}\ |\cdots|\ v_{k}\ ]$ has an echelon form with a **pivot in every column**, namely no free columns.
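The computational note can be carried out mechanically: row-reduce the matrix whose columns are the given vectors and check for a pivot in every column. A minimal Python sketch (the function names are ours; exact arithmetic via `fractions.Fraction` avoids floating-point pitfalls):

```python
from fractions import Fraction

def pivot_count(columns):
    """Count the pivots of the matrix whose columns are the given vectors,
    using forward elimination with exact rational arithmetic."""
    m, n = len(columns[0]), len(columns)          # m rows, n columns
    A = [[Fraction(columns[j][i]) for j in range(n)] for i in range(m)]
    pivots = 0
    for col in range(n):
        # Look for a nonzero entry at or below row `pivots` in this column.
        r = next((i for i in range(pivots, m) if A[i][col] != 0), None)
        if r is None:
            continue                               # free column: no pivot here
        A[pivots], A[r] = A[r], A[pivots]          # swap the pivot row up
        for i in range(pivots + 1, m):             # eliminate below the pivot
            f = A[i][col] / A[pivots][col]
            A[i] = [a - f * b for a, b in zip(A[i], A[pivots])]
        pivots += 1
    return pivots

def is_linearly_independent(columns):
    # Independent iff the echelon form has a pivot in every column.
    return pivot_count(columns) == len(columns)

print(is_linearly_independent([[1, 2], [2, 2]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))  # False
```

This reproduces the two examples above: the pair $\begin{bmatrix}1\\2\end{bmatrix},\begin{bmatrix}2\\2\end{bmatrix}$ yields two pivots (independent), while $\begin{bmatrix}1\\2\end{bmatrix},\begin{bmatrix}2\\4\end{bmatrix}$ yields only one (dependent).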
Again, this makes sense because it is equivalent to saying the homogeneous equation $c_{1}v_{1}+c_{2}v_{2}+\cdots + c_{k}v_{k}=\vec 0$ (which is always consistent, having the trivial solution) has no free variables, so we have a unique solution, which is just the trivial solution; hence $v_{1},v_{2},\ldots,v_{k}$ are linearly independent. Note we didn't need to write the extra augmented column of zeros, because it remains zero during row reduction.

**Example.** Is the list of vectors $\begin{bmatrix}1\\1\\1\\1\end{bmatrix},\begin{bmatrix}1\\1\\2\\2\end{bmatrix},\begin{bmatrix}2\\1\\2\\1\end{bmatrix}$ linearly independent? Using the computational note above, we row-reduce $$ \begin{bmatrix}1 & 1 & 2 \\ 1 & 1 & 1 \\ 1 & 2 & 2 \\ 1 & 2 & 1\end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}1 & 1 & 2 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \\ 0 & 1 & -1\end{bmatrix}\stackrel{\text{row}}\sim \begin{bmatrix}1 & 1 & 2 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0\end{bmatrix}, $$ which has a pivot in every column and no free variables. This implies $$ c_{1}\begin{bmatrix}1\\1\\1\\1\end{bmatrix}+c_{2}\begin{bmatrix}1\\1\\2\\2\end{bmatrix}+c_{3}\begin{bmatrix}2\\1\\2\\1\end{bmatrix} = \begin{bmatrix}0\\0\\0\\0\end{bmatrix} $$ has only the trivial solution, hence they are **linearly independent**.

---

Before we move on too far, we make a couple of remarks.

**Remark.** What if we have an empty list or empty set of vectors? We make this definition: An empty list or empty set of vectors is **linearly independent**. So, $\{\ \ \}$ is linearly independent. This holds vacuously, as there are no linear equations to check.

**Remark.** What if we have a list of just one vector, or a singleton set $\{\vec v\}$? Well, we can work out when this is linearly independent. Using the definition, consider the equation $$ c\vec v = \vec 0. $$ If $\vec v\neq \vec 0$, then $c$ must be $0$. Hence if $\vec v\neq \vec 0$, the set $\{ \vec v \}$ is linearly independent.
If $\vec v = \vec 0$, then we can take $c=1$ and have $1\vec v=1\cdot \vec 0=\vec 0$, a nontrivial solution. Hence the set $\{\vec 0\}$ is **not** linearly independent.

**Remark.** In fact, any list of vectors in which one of them is the zero vector is **not** linearly independent. Think about why, and try to prove it.

**Remark.** Since we keep using the phrase "not linearly independent", we shall make the following definition.

> **Definition.** A list of vectors $v_{1},v_{2},\ldots,v_{k}$ is said to be **linearly dependent** if it is not linearly independent.

**Remark.** What if the list of vectors is infinite? In this situation we need to expand the definition: An infinite list of vectors is linearly independent if **every finite sublist** of it **is linearly independent**. We need to define it this way because taking a linear combination is a finitary sum process. (If instead we really want to consider "infinite sums", then we are really talking about infinite series. In that case we enter the realm of calculus/analysis -- this field of infinite-dimensional linear algebra is called **functional analysis**, a world filled with monsters and dragons and unsolved problems.)

---

We close for now with a characterization of linear independence, which adds to the meaning of "what does it mean for a list of vectors to be linearly independent":

> **A characterization of linear independence.**
> A list of $k\ge 2$ vectors $v_{1},v_{2},\ldots,v_{k}$ is linearly independent if and only if no one vector is a linear combination of the others. That is to say, you cannot make $v_{i}$ via a linear combination of the rest of the vectors in the list.

Hence it is in this sense that each vector is "essential", or "nonredundant", if we were to use them as a spanning set.
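This characterization can also be tested mechanically: $v_{i}$ is a linear combination of the other vectors exactly when appending $v_{i}$ to the others does not increase the number of pivots (the rank). A minimal Python sketch under that observation (helper names are ours; exact arithmetic via `fractions.Fraction`; it assumes at least two vectors, per the characterization):

```python
from fractions import Fraction

def rank(columns):
    """Number of pivots of the matrix with the given column vectors
    (forward elimination with exact rational arithmetic)."""
    m, n = len(columns[0]), len(columns)
    A = [[Fraction(columns[j][i]) for j in range(n)] for i in range(m)]
    pivots = 0
    for col in range(n):
        r = next((i for i in range(pivots, m) if A[i][col] != 0), None)
        if r is None:
            continue                      # free column: no pivot
        A[pivots], A[r] = A[r], A[pivots]
        for i in range(pivots + 1, m):
            f = A[i][col] / A[pivots][col]
            A[i] = [a - f * b for a, b in zip(A[i], A[pivots])]
        pivots += 1
    return pivots

def some_vector_is_combination_of_others(columns):
    """Assumes len(columns) >= 2. True iff some v_i lies in the span of
    the rest, i.e. iff adding v_i to the others does not raise the rank."""
    for i in range(len(columns)):
        others = columns[:i] + columns[i + 1:]
        if rank(others + [columns[i]]) == rank(others):
            return True
    return False

# Dependent pair: the second vector is twice the first.
print(some_vector_is_combination_of_others([[1, 2], [2, 4]]))  # True
# Independent pair: neither is a multiple of the other.
print(some_vector_is_combination_of_others([[1, 2], [2, 2]]))  # False
```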
The proof of this will be found in the next note, [[smc-spring-2024-math-13/linear-algebra-notes/19a-proof-of-a-characterization-of-linear-independence|notes 19a]]. Additional notes and examples on linear independence will be found in [[smc-spring-2024-math-13/linear-algebra-notes/19b-more-examples-of-linear-independence|notes 19b]].

**Remark.** The concepts of linear independence and spanning sets are **of paramount importance** in linear algebra. The linear algebra we deal with here is mostly finite-dimensional, and this subject is "reasonably manageable" because of these two crucial concepts: linear independence and spanning sets. So make sure you understand them well. Of course, this is not the last time we will see them, and we will continue to build our understanding of these concepts.